# Dynamic masking training

## Roberta Base Mr

A transformers model pre-trained on a large-scale Marathi corpus using self-supervised learning, intended primarily for masked language modeling and downstream task fine-tuning (see the sketch after this entry).

Large Language Model · flax-community · 156 · 1
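Below is a minimal sketch of how dynamic masking training is typically set up with the Hugging Face `transformers` library. The checkpoint id `flax-community/roberta-base-mr` is inferred from the entry above and should be verified before use. The key point is that `DataCollatorForLanguageModeling` draws a fresh random mask each time a batch is built, so every epoch sees different masked positions (RoBERTa-style dynamic masking), instead of one mask fixed during preprocessing.

```python
# A minimal sketch of dynamic masking, assuming the Hugging Face
# `transformers` library. The checkpoint id is inferred from the
# entry above and may differ from the actual repo name.
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

tokenizer = AutoTokenizer.from_pretrained("flax-community/roberta-base-mr")

# Masking happens at batch-creation time, so each call (and hence each
# epoch) applies a new random mask to the same underlying tokens.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer,
    mlm=True,              # masked language modeling objective
    mlm_probability=0.15,  # standard 15% masking rate
)

# The same encoded sentence receives a different mask on every call.
encoding = tokenizer("ही एक चाचणी आहे.", return_tensors="pt")  # "This is a test."
batch1 = collator([{"input_ids": encoding["input_ids"][0]}])
batch2 = collator([{"input_ids": encoding["input_ids"][0]}])
print(batch1["input_ids"])  # masked positions generally differ
print(batch2["input_ids"])  # between the two batches
```

Passing the collator as `data_collator` to a `Trainer` gives the same effect during pre-training: the model never sees a fixed, precomputed masking of the corpus.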
## Roberta Small Bulgarian

A streamlined version of the Bulgarian RoBERTa model containing only 6 hidden layers while maintaining comparable performance; it can be queried for masked-token prediction as sketched below.

Large Language Model · Other · iarfmoose · 21 · 0
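As a usage example, masked language modeling can be exercised directly through the `fill-mask` pipeline. This is a minimal sketch, assuming the checkpoint id `iarfmoose/roberta-small-bulgarian` (inferred from the entry above):

```python
# A minimal sketch, assuming the Hugging Face `transformers` library and
# the checkpoint id "iarfmoose/roberta-small-bulgarian" (inferred from
# the entry above; verify the exact repo name before use).
from transformers import pipeline

# RoBERTa-family models use "<mask>" as the mask token.
fill = pipeline("fill-mask", model="iarfmoose/roberta-small-bulgarian")

# Bulgarian: "The capital of Bulgaria is <mask>."
for prediction in fill("Столицата на България е <mask>."):
    print(prediction["token_str"], prediction["score"])
```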